Fast spectral algorithms from sum-of-squares proofs

Sam Hopkins

Monday, March 21, 2016
4:00pm, 310 Gates Hall

Abstract:

There is a long history in algorithm design of using relatively slow (but still polynomial-time) convex-relaxation-based algorithms as springboards for very fast algorithms that exploit the structure of a convex relaxation without explicitly solving it. (For example, the primal-dual, multiplicative weights, and matrix multiplicative weights techniques all exploit the existence of an LP or SDP without paying the running-time cost of solving it.)
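To make that pattern concrete, here is a minimal sketch (not part of the talk itself) of the multiplicative weights method approximating the value of a zero-sum game, i.e., of an LP, without ever forming or solving the LP; the function name and parameters are illustrative choices, not from the source.

    import numpy as np

    def mwu_game_value(A, T=2000, eta=0.05):
        """Approximate the value of the zero-sum game max_x min_y x^T A y over
        probability simplices via multiplicative weights (illustrative sketch)."""
        m, _ = A.shape
        w = np.ones(m)                   # weights over the row player's pure strategies
        payoffs = []
        for _ in range(T):
            x = w / w.sum()              # current mixed strategy of the row player
            j = np.argmin(x @ A)         # column player best-responds to x
            payoffs.append(x @ A[:, j])
            w *= np.exp(eta * A[:, j])   # boost rows that did well against column j
        return float(np.mean(payoffs))   # approaches the game value as T grows

For payoffs bounded in [0, 1], the average payoff converges to the game (LP) value at rate roughly O(sqrt(log m / T)), which is the sense in which the LP's existence is exploited without paying to solve it.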

The sum-of-squares hierarchy is a particularly powerful family of convex programs generalizing LPs and SDPs, and it has recently shown promise of providing algorithmic guarantees that are out of reach for LP- and SDP-based algorithms. Unfortunately, solving sum-of-squares programs carries an even higher running-time cost than solving traditional LP and SDP relaxations. We introduce new techniques to design very fast algorithms (often running in nearly-linear time) that exploit the existence of sum-of-squares-based algorithms without paying the running-time cost of solving the sum-of-squares program. We show that the resulting algorithms (nearly) match the performance of sum-of-squares on an array of problems from machine learning where sum-of-squares is known to outperform LP- and SDP-based algorithms.
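To give a flavor of what a fast spectral algorithm looks like in this setting, here is an illustrative sketch (not the algorithm from the paper) for a planted-spike tensor problem: recover v from a noisy rank-one 3-tensor T ≈ lam * v⊗v⊗v + noise by power iteration on the n x n^2 unfolding of T. All names and parameters below are hypothetical choices for the example.

    import numpy as np

    def recover_planted_direction(T, iters=100, seed=0):
        """Illustrative sketch: power iteration on the n x n^2 unfolding of a
        noisy rank-one 3-tensor. Each iteration costs O(n^3) arithmetic
        operations, i.e. linear in the size of the input tensor."""
        n = T.shape[0]
        M = T.reshape(n, n * n)          # flatten the tensor into an n x n^2 matrix
        u = np.random.default_rng(seed).standard_normal(n)
        for _ in range(iters):
            u = M @ (M.T @ u)            # power step for M M^T, never formed explicitly
            u /= np.linalg.norm(u)
        return u                         # estimate of the top left singular vector

    # Hypothetical usage: plant a spike and check recovery.
    rng = np.random.default_rng(1)
    n, lam = 60, 100.0
    v = rng.standard_normal(n); v /= np.linalg.norm(v)
    T = lam * np.einsum('i,j,k->ijk', v, v, v) + rng.standard_normal((n, n, n))
    print(abs(recover_planted_direction(T) @ v))   # close to 1 when lam is large enough

The point of the sketch is only the running-time pattern: a few matrix-vector products against the flattened input stand in for an expensive convex program, which is the kind of trade-off the talk is about.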

Joint work with Jonathan Shi, Tselil Schramm, and David Steurer. To appear in STOC 2016.